Strong Optimality of the Normalized ML Models

Author

  • J Rissanen
Abstract

We show that the normalized maximum likelihood (NML) distribution, as a universal code for a parametric class of models, is closest to the negative logarithm of the maximized likelihood in the mean code length distance, where the mean is taken with respect to the worst case model inside or outside the parametric class. We strengthen this result by showing that the same minmax bound results even when the data generating models are restricted to be most `benevolent' in minimizing the mean of the negative logarithm of the maximized likelihood. Further, we show for the class of exponential models that the bound cannot be beaten in essence by any code except when the mean is taken with respect to the most benevolent data generating models in a set of vanishing size. These results allow us to decompose the data into two parts, the first having all the useful information that can be extracted with the parametric models and the rest which has none. We also show that, if we change Akaike's quest for the model in a parametric class that is closest to the data generating model in Kullback-Leibler distance to searching for the nearest universal model for the class, we obtain the MDL criterion rather than the AIC.
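The NML construction in the abstract can be made concrete with a small example. The following sketch (not from the paper; the Bernoulli model class and sample size are illustrative choices) computes the NML probability of a binary sequence: each sequence is weighted by its maximized likelihood, and the weights are normalized over all sequences of the same length.

```python
from math import comb

def max_likelihood(k, n):
    # Maximized Bernoulli likelihood of a length-n sequence with k ones:
    # the MLE is p_hat = k/n, so the value is p_hat^k * (1 - p_hat)^(n - k).
    if k == 0 or k == n:
        return 1.0
    p = k / n
    return p ** k * (1 - p) ** (n - k)

def nml_bernoulli(k, n):
    # NML probability of one specific sequence with k ones: the maximized
    # likelihood divided by the normalizer C_n, which sums the maximized
    # likelihood over all 2^n sequences (grouped by their count of ones).
    C = sum(comb(n, j) * max_likelihood(j, n) for j in range(n + 1))
    return max_likelihood(k, n) / C

# The NML probabilities over all length-n sequences sum to one.
n = 10
total = sum(comb(n, k) * nml_bernoulli(k, n) for k in range(n + 1))
```

The negative logarithm of `nml_bernoulli(k, n)` is then the NML code length for the sequence, exceeding the negative maximized log-likelihood by exactly `log C` regardless of the data, which is the constant regret property the minmax result refers to.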


Similar resources

Strong convergence for variational inequalities and equilibrium problems and representations

We introduce an implicit method for finding a common element of the set of solutions of systems of equilibrium problems and the set of common fixed points of a sequence of nonexpansive mappings and a representation of nonexpansive mappings. Then we prove the strong convergence of the proposed implicit schemes to the unique solution of a variational inequality, which is the optimality condition for ...


Sequential Optimality Conditions and Variational Inequalities

In recent years, sequential optimality conditions have frequently been used to prove convergence of iterative methods for solving nonlinear constrained optimization problems. The sequential optimality conditions do not require any of the constraint qualifications. In this paper, we present the necessary sequential complementary approximate Karush-Kuhn-Tucker (CAKKT) condition for a point to be a solution of a ...


Exchangeability Characterizes Optimality of Sequential Normalized Maximum Likelihood and Bayesian Prediction with Jeffreys Prior

We study online prediction of individual sequences under logarithmic loss with parametric constant experts. The optimal strategy, normalized maximum likelihood (NML), is computationally demanding and requires the length of the game to be known. We consider two simpler strategies: sequential normalized maximum likelihood (SNML), which computes the NML forecasts at each round as if it were the la...


Self-efficacy in Patients with Multiple Sclerosis: A Model Test Study

Background: Multiple Sclerosis (MS) is a common disease among young people, and self-efficacy is a crucial factor in these patients. Various variables, including demographic characteristics and disease symptoms, affect self-efficacy. Therefore, it is necessary to assess the relationship between these factors using a clear and comprehensive model. Aim: This s...




Publication date: 2000